Segmentation of Cluttered Scenes through Interactive Perception
Authors
Abstract
For a robot to perform its tasks competently, robustly, and in the right context, it has to understand the course of its actions and their consequences. Imagine, for example, a robot tasked with cleaning up the breakfast table. The robot is confronted with a heavily cluttered scene and has to be able to tell waste, dirty, clean, and valuable objects apart. The robot should be equipped with knowledge that will, for instance, stop it from throwing away an expensive item. The approach proposed here elevates the robot's perception skills by exploiting its ability to interact with the clutter of objects. This allows for better segmentation and, in turn, better object recognition by constraining recognition to a region or regions of interest. Similar to Katz et al. [1] and Bergstrom et al. [2], we propose a system that uses a robot arm to induce motions in a scene to enable effective object segmentation. Our system employs a combination of the following techniques: i) estimation of a contact point and a push direction for the robot's end effector by detecting concave corners in the cluttered scene, ii) feature extraction using the features proposed by Shi and Tomasi and tracking using optical flow, and iii) a novel clustering algorithm to segment the objects. Segmentation of rigid objects from a video stream of objects being moved by the robot has been addressed by Fitzpatrick [3] and Kenney et al. [4]. In contrast, our arm motion is not pre-planned but adapts to the scene, we make use of 3D data to segment object candidates from the background, and we use a novel clustering approach for the segmentation of textured objects. An overview of the whole system is shown in Fig. 2. The system will be demonstrated live during the workshop.
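As a rough illustration of steps (ii) and (iii), the sketch below (not the authors' code) detects Shi-Tomasi corners in a frame captured before a push, tracks them into a frame captured after the push with pyramidal Lucas-Kanade optical flow, and then groups the tracked features by the similarity of their motion vectors as a simple stand-in for the clustering step. The OpenCV and NumPy calls are standard; the frame file names, the parameter values, and the distance threshold eps are illustrative placeholders, and the greedy grouping is not the novel clustering algorithm referred to in the abstract.

# Hypothetical sketch: motion-based grouping of tracked features after a push.
import cv2
import numpy as np

# Two grayscale frames bracketing the robot's pushing motion (placeholder paths).
prev = cv2.imread("frame_before_push.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("frame_after_push.png", cv2.IMREAD_GRAYSCALE)

# (ii) Shi-Tomasi corner detection ("good features to track").
pts = cv2.goodFeaturesToTrack(prev, maxCorners=400, qualityLevel=0.01, minDistance=7)

# Pyramidal Lucas-Kanade optical flow follows the corners into the next frame.
nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev, curr, pts, None,
                                          winSize=(21, 21), maxLevel=3)
good_old = pts[status.ravel() == 1].reshape(-1, 2)
good_new = nxt[status.ravel() == 1].reshape(-1, 2)
flow = good_new - good_old  # per-feature motion induced by the push

# (iii) Stand-in clustering: greedily group features whose motion vectors are
# similar, so points that move rigidly together form one object candidate.
def cluster_by_motion(vectors, eps=2.0):
    labels = -np.ones(len(vectors), dtype=int)
    centers = []
    for i, v in enumerate(vectors):
        for k, c in enumerate(centers):
            if np.linalg.norm(v - c) < eps:
                labels[i] = k
                break
        else:
            centers.append(v.copy())
            labels[i] = len(centers) - 1
    return labels

labels = cluster_by_motion(flow)
print("object candidates (motion clusters):", len(set(labels)))

In the full system, step (i) would already have chosen the contact point and push direction from concave corners detected in the cluttered scene before these two frames are captured.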
Similar references
Visual Perception: Converging Mechanisms of Attention, Binding, and Segmentation?
Visual scenes are cluttered. Recent evidence suggests that areas as early as V1 and V2 help make sense of the scene by segmenting it into distinct objects, separating foreground from background, and binding features.
Visibility in three-dimensional cluttered scenes.
Three-dimensional (3D) cluttered scenes consist of a large number of small surfaces distributed randomly in a 3D view volume. The canonical example is the foliage of a tree or bush. 3D cluttered scenes are challenging for vision tasks such as object recognition and depth perception because most surfaces or objects are only partly visible. This paper examines the probabilities of surface visibil...
Robust Segmentation of cluttered scenes using RGB-Z images
With the availability of commercial depth sensors such as the Kinect, there is a body of work using the depth information provided by the Kinect to segment images of scenes [1][6]. In this project we try to segment cluttered scenes by augmenting color information in images with depth cues from the Kinect data. In this milestone, our major focus was to study the literature in the RGBZ domain, im...
SmartAnnotator: An Interactive Tool for Annotating RGBD Indoor Images
RGBD images with high quality annotations in the form of geometric (i.e., segmentation) and structural (i.e., how the segments are mutually related in 3D) information provide valuable priors to a large number of scene and image manipulation applications. While it is now simple to acquire RGBD images, annotating them, automatically or manually, remains challenging especially in cluttered nois...
Graphcut-based Interactive Segmentation using Colour and Depth cues
Segmentation of novel or dynamic objects in a scene, often referred to as background subtraction or foreground segmentation, is critical for robust high level computer vision applications such as object tracking, object classification and recognition. However, automatic realtime segmentation for robotics still poses challenges including global illumination changes, shadows, inter-reflections, c...
Journal:
Volume, Issue:
Pages: -
Publication date: 2012